Multiple Task Learning Using Iteratively Reweighted Least Square

Authors

  • Jian Pu
  • Yu-Gang Jiang
  • Jun Wang
  • Xiangyang Xue
Abstract

Multiple task learning (MTL) is becoming popular due to its theoretical advances and empirical successes. The key idea of MTL is to explore the hidden relationships among multiple tasks to enhance learning performance. Recently, many MTL algorithms have been developed and applied to various problems such as feature selection and kernel learning. However, most existing methods rely heavily on specific assumptions about the task relationships. For instance, several works assume that there is a major task group plus several outlier tasks, and use a decomposition approach to identify the group structure and the outlier tasks simultaneously. In this paper, we adopt a more general formulation for MTL without making specific structural assumptions. Instead of performing model decomposition, we directly impose an elastic-net regularization with a mixture of structure and outlier penalties, and formulate the objective as an unconstrained convex problem. To derive the optimal solution efficiently, we propose an Iteratively Reweighted Least Square (IRLS) method with a preconditioned conjugate gradient, which is computationally affordable for high-dimensional data. Extensive experiments are conducted on both synthetic and real data, and comparisons with several state-of-the-art algorithms clearly show the superior performance of the proposed method.
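The paper's full objective (elastic-net regularized MTL solved by IRLS with a preconditioned conjugate gradient) is not reproduced in this abstract. As a rough illustration of the core IRLS idea only, the sketch below fits a plain l1 (robust) regression by repeatedly solving a reweighted least-squares problem; the function name and toy data are illustrative, not from the paper.

```python
import numpy as np

def irls_l1_regression(X, y, n_iter=50, eps=1e-6):
    """Fit min_w ||Xw - y||_1 by iteratively reweighted least squares.

    Each iteration solves a weighted least-squares problem whose weights
    1/|r_i| down-weight samples with large residuals; eps guards against
    division by zero. (For high-dimensional problems the paper pairs IRLS
    with a preconditioned conjugate gradient solver; a direct solve is
    used here for clarity.)
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        r = X @ w - y
        s = 1.0 / np.maximum(np.abs(r), eps)   # per-sample weights
        A = X.T @ (s[:, None] * X)             # weighted normal equations
        b = X.T @ (s * y)
        w = np.linalg.solve(A, b)
    return w

# Toy usage: recover a line from data containing one gross outlier,
# which ordinary least squares would be pulled toward.
X = np.column_stack([np.ones(20), np.arange(20.0)])
y = X @ np.array([1.0, 2.0])
y[5] += 100.0                                  # inject an outlier
w = irls_l1_regression(X, y)
```

Because the l1 loss is approximated by a sequence of quadratic problems, each iteration reuses standard least-squares machinery, which is what makes conjugate-gradient acceleration natural in the high-dimensional setting the abstract describes.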


Similar references

Multiple task learning with flexible structure regularization

Due to the theoretical advances and empirical successes, Multi-task Learning (MTL) has become a popular design paradigm for training a set of tasks jointly. Through exploring the hidden relationships among multiple tasks, many MTL algorithms have been developed to enhance learning performance. In general, the complicated hidden relationships can be considered as a combination of two key structu...


An Iteratively Reweighted Least Square Implementation for Face Recognition

We propose, as an alternative to current face recognition paradigms, an algorithm using reweighted l2 minimization, whose recognition rates are not only comparable to the random projection using l1 minimization compressive sensing method of Yang et al [5], but also robust to occlusion. Through numerical experiments, reweighted l2 mirrors the l1 solution [1] even with occlusion. Moreover, we pre...


Iteratively reweighted LASSO for mapping multiple quantitative trait loci

The iteratively reweighted least square (IRLS) method is largely identical to the maximum likelihood (ML) method in terms of parameter estimation and power of quantitative trait locus (QTL) detection, but IRLS is greatly superior to ML in terms of computing speed and the robustness of parameter estimation. In conjunction with the priors of parameters, ML can analyze multiple QTL model based on B...


Credit Risk Classification Using Kernel Logistic Regression-least Square Support Vector Machine

Kernel Logistic Regression (KLR) is one of the statistical models that have been proposed for classification in the machine learning and data mining communities, and also one of the effective methodologies in the kernel-machine techniques. The parameters of KLR model are usually fitted by the solution of a convex optimization problem that can be found using the well known Iteratively Reweighted...


Fast General Norm Approximation via Iteratively Reweighted Least Squares

This paper describes an efficient method for general norm approximation, which appears frequently in various computer vision problems. Many such problems are formulated differently but all require minimizing a sum of weighted norms as a general norm approximation. We therefore extend Iteratively Reweighted Least Squares (IRLS), which was originally designed for minimizing a single norm. The pr...




Publication date: 2013